
---
title: "Adaptive Multilayer Perceptron"
output: 
  flexdashboard::flex_dashboard:
    storyboard: true
    social: menu
    source: embed
  
    
---


```{r setup, include=FALSE}
library(flexdashboard)
```

### Presentation.

![alt](/Users/Fernanda/Downloads/first.png)

### Introduction.




![alt](/Users/Fernanda/Downloads/second.png)



***

1. Artificial Neural Networks
  - A neural network is not itself an algorithm, but rather a framework within which many different machine learning algorithms can work together to process complex data inputs.



###  Introduction.



![alt](/Users/Fernanda/Downloads/third.png)

*** 

1. Multilayer Perceptron

  - A Multilayer Perceptron (MLP) is a class of feedforward artificial neural network.
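
A minimal illustration of what such a network computes (a Python/NumPy sketch, not the presentation's implementation; layer sizes are arbitrary):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, W1, b1, W2, b2):
    # forward propagation: hidden layer, then output layer
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)  # 4 inputs -> 3 hidden units
W2, b2 = rng.normal(size=(2, 3)), np.zeros(2)  # 3 hidden -> 2 outputs
y = mlp_forward(rng.normal(size=4), W1, b1, W2, b2)
```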
  





### Introduction.



![alt](/Users/Fernanda/Downloads/four.png)


***
1. Deep Neural Networks:

 - Computer vision;
 - Speech recognition;
 - Natural language processing.





### Introduction.


![alt](/Users/Fernanda/Downloads/five.png)

***

 - Convolutional Neural Network (CNN)


### Introduction.


![alt](/Users/Fernanda/Downloads/six.png)

***
1. GoogLeNet



### Introduction.


![alt](/Users/Fernanda/Downloads/seven.png)


***

1. Long Short-Term Memory (LSTM)

- LSTMs are a special kind of RNN, capable of learning long-term dependencies.




### Motivation.


![alt](/Users/Fernanda/Downloads/eight.png)


***

1. Deep Learning Challenges 

- Best architecture for a specific task
- Unsupervised training
- Generalization for many tasks

2. Neuroevolution

- Neuroevolution is a subfield of artificial intelligence (AI) and machine learning (ML) that tries to trigger, inside a computer, an evolutionary process similar to the one that produced our brains. In other words, neuroevolution seeks to develop the means of evolving neural networks through evolutionary algorithms.
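
The idea can be sketched as a tiny evolutionary loop (an illustrative Python toy, not the method used here; the fitness function is a made-up stand-in for validation accuracy):

```python
def fitness(hidden_neurons):
    # toy stand-in for validation accuracy: peaks at 8 hidden neurons
    return -(hidden_neurons - 8) ** 2

population = [2, 14, 5]  # candidate hidden-layer sizes
for generation in range(10):
    # mutation: each candidate also proposes one neuron fewer and one more
    candidates = {max(1, h + d) for h in population for d in (-1, 0, 1)}
    # selection: keep the three fittest candidates
    population = sorted(candidates, key=fitness, reverse=True)[:3]

best = population[0]  # the loop settles on the 8-neuron architecture
```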



### Emergent Neural Networks.

![alt](/Users/Fernanda/Downloads/nine.png)



### Emergent Multilayer Perceptron.

![alt](/Users/Fernanda/Downloads/31.png)


***
1. Our Proposal.

- Give the neural network the possibility of changing the number of neurons during the learning process;
- Find the best model;
- Increase learning accuracy;
- Decrease the error;
- Decrease training time.




### Multilayer Perceptron

![alt](/Users/Fernanda/Downloads/ten.png)



### Forward Propagation

![alt](/Users/Fernanda/Downloads/eleven.png)



### Gradient Descent.


![alt](/Users/Fernanda/Downloads/tw.png)




### Back Propagation.

![alt](/Users/Fernanda/Downloads/eq.png)


### Back Propagation.

![alt](/Users/Fernanda/Downloads/eq1.png)


### Back Propagation.

![alt](/Users/Fernanda/Downloads/eq2.png)



### Component-based MLP.


![alt](/Users/Fernanda/Downloads/21.png)

***

1. Components


### Adaptation.

![alt](/Users/Fernanda/Downloads/22.png)

***

 - The weight matrices are preserved while switching between variants;

 - The matrices' dimensions must be checked and transformed to match each variant's number of neurons.
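
One way that dimension transform could look (a NumPy sketch; the policy of initialising newly added weights with small random values is an assumption, not necessarily the one used here):

```python
import numpy as np

def resize_weights(W, new_rows, new_cols, rng):
    # keep the overlapping block of the learned weight matrix and
    # fill any new rows/columns with small random values
    resized = rng.normal(scale=0.01, size=(new_rows, new_cols))
    r, c = min(W.shape[0], new_rows), min(W.shape[1], new_cols)
    resized[:r, :c] = W[:r, :c]
    return resized

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))              # 3 hidden neurons, 4 inputs
W_grown = resize_weights(W, 5, 4, rng)   # variant with 5 hidden neurons
```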



### Perceptron.

![alt](/Users/Fernanda/Downloads/23.png)



### Learning.


![alt](/Users/Fernanda/Downloads/24.png)



***
1. getPerceptionData() : Learning accuracy
2. setConfig(last_config) : Hidden layer with a different number of neurons
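
Roughly, the component interface above could look like this (a hypothetical Python sketch: only the two method names come from the slide, the bodies and class name are assumed):

```python
class AdaptiveMLPComponent:
    """Hypothetical component wrapping one MLP variant."""

    def __init__(self, hidden_neurons):
        self.config = {"hidden_neurons": hidden_neurons}
        self.accuracy = 0.0  # updated as training progresses

    def getPerceptionData(self):
        # expose the current learning accuracy to the adaptation manager
        return self.accuracy

    def setConfig(self, last_config):
        # switch to a variant whose hidden layer has a different neuron count
        self.config = dict(last_config)

component = AdaptiveMLPComponent(hidden_neurons=4)
component.setConfig({"hidden_neurons": 8})
```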


### Demo.


![alt](/Users/Fernanda/Downloads/32.png)
    
    


### Data.

```{r}
pairs(iris[1:4],
      main = "Anderson's Iris Data -- 3 species",
      pch = 21,
      bg = c("red", "green3", "blue")[unclass(iris$Species)],
      lower.panel = NULL,
      labels = c("SL", "SW", "PL", "PW"),
      font.labels = 2,
      cex.labels = 4.5)
```


***

Iris Dataset


1. It includes three iris species with 50 samples each, as well as some properties about each flower. One flower species is linearly separable from the other two, but the other two are not linearly separable from each other.

2. The columns in this dataset are:

- SepalLength
- SepalWidth
- PetalLength
- PetalWidth

- Class:
  - Iris Setosa
  - Iris Versicolour
  - Iris Virginica



### The Code.


![alt](/Users/Fernanda/Downloads/25.png)



### Questions.

![alt](/Users/Fernanda/Downloads/12.png)